Savannah State University
Fifth-Year Interim Report

CS 3.3.1.1 Institutional Effectiveness: Educational Programs

The institution identifies outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in educational programs, to include student learning outcomes.

_X_ Compliance                     ___ Non-Compliance

Savannah State University (SSU) is in compliance with Comprehensive Standard 3.3.1.1. Each active academic degree program has identified program outcomes and program student learning outcomes, developed assessments to measure performance, collected data, analyzed the degree of success in achieving outcome targets, and documented the use of data to support enhancements to programs and student learning opportunities.

The following narrative contains: (1) an overview of SSU's process for assessing educational programs; (2) a guide to assessment reports and their collection, which provides the primary evidence for compliance with CS 3.3.1.1; and (3) an indication of next steps to uphold the practice of continuous improvement for assessment activities at SSU.

Section 1: Overview of SSU's Process for Assessing Educational Programs, Including Program Outcomes & Program Student Learning Outcomes

The Evolution of Assessment at SSU

Following the SACSCOC reaffirmation of accreditation in spring 2011, momentum built around the newly approved QEP, The Write Attitude. In cooperation with the Office of the QEP, faculty began to engage in the direct assessment of students' writing ability. This focus on written communication as the student learning outcome for the QEP led to a broader campus discussion regarding the development of institutional-level outcomes, and it catalyzed institutional assessment efforts that emphasized direct, authentic evaluation of student writing across the core curriculum and in degree-granting programs.

This momentum around institution-wide assessment through the QEP gave rise to SSU's adoption of five additional Institutional Student Learning Outcomes (ISLOs), characteristic of the foundational general education experiences of SSU students regardless of their major area of study. SSU's ISLOs are: (1) written communication (the specific outcome articulated by the QEP); (2) critical thinking; (3) information literacy; (4) quantitative literacy; (5) ethical reasoning; and (6) integrative learning. [1]

Beginning in fall 2013, SSU aligned the new ISLOs with Program Student Learning Outcomes (PSLOs). Once aligned, SSU was able to return its academic assessment focus to monitoring student learning and program outcomes at the major level. As a result, in short order, SSU implemented an authentic, course-embedded assessment strategy that simultaneously examined (1) written communication for the QEP (see the QEP Impact Report for more details); (2) the five other ISLOs for the institution; and (3) PSLOs, including, but not limited to, the translation of ISLOs to discipline-specific contexts (e.g., assessing critical thinking within Political Science). This alignment of course-, program-, and institutional-level outcomes and assessment efforts represented a more robust and improved assessment practice on campus. Figure 1 traces the development of these assessment practices over the past few years.

 

Figure 1: Evolution of SSU's Approach to Assessing Student Learning

 

Program-Level Assessment at SSU

Consistent with SACSCOC Standard 3.3.1.1, program-level assessment at SSU examines (1) Program Outcomes (POs), which emphasize general features of the program and the educational environment created by the program (e.g., number of tenure-track faculty, number of publications per faculty member, number of students per program, student completion rates, instructional delivery development, program outreach and visibility efforts); and (2) Program Student Learning Outcomes (PSLOs), which directly relate to what learners know or can do upon successful completion of the program curriculum. The assessment process for educational programs at Savannah State University guides all those ultimately responsible for student learning, from those in leadership positions like deans and department chairs to those most directly engaged with student learning like program assessment coordinators and the faculty teaching the courses, in the comprehensive evaluation of all program and student learning outcomes in alignment with SSU's mission, vision, and ISLOs. Each college/school and academic department is responsible for supporting the development and implementation of the systematic assessment of all POs and PSLOs in each of its academic programs. It is the task of the faculty in each academic program to create and execute a plan to conduct annual, systematic assessment of their PSLOs, to evaluate assessment results, and, when appropriate, to implement changes designed to improve student achievement of the learning outcomes. It is the institutional expectation that the effectiveness of any changes is also monitored. The aim is the constant improvement of academic programs, student learning, and the evaluation of both at the institution.

The process for assessing educational programs follows a consistent assessment timeline, documented in SSU's Institutional Assessment of Student Learning Annual Calendar. Since 2013, the institution has operated according to an annual assessment cycle, with activities organized by semester (see Figure 2). This approach ensures that (1) the institution has continuous evidence of assessment practices on campus; (2) programs are able to review the efficacy of their offerings in a timely manner, so that changes can be made as evidence emerges; and, to that end, (3) program faculty regularly reflect and improve upon effective teaching approaches. Further, there is the expectation that programs organize their assessment planning to measure each PSLO at least twice in a three-year period, with the most recent period spanning fall 2013 – spring 2016. It is recommended that all PSLOs be assessed three times in the three-year period to obtain a baseline, an intervention, and a test of the efficacy of the intervention over time. Ultimately, over the course of a three-year period, programs should generate three assessment plans and three assessment reports, one of each at the beginning (plan) and end (report) of each academic year. The Academic Assessment Plans and Reports assist the program faculty in making appropriate changes to the curriculum and the assessment tools for educational improvement. Figure 2 outlines the rhythm of the annual assessment process for programs at the University.
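To make the twice-in-three-years coverage rule concrete, the following is a minimal illustrative sketch in Python; the outcome names and the plan data are hypothetical and are not drawn from any actual SSU program:

    # Hypothetical three-year plan: each academic year maps to the set of
    # PSLOs the program intends to measure that year.
    plan = {
        "2013-2014": {"PSLO 1", "PSLO 2", "PSLO 3"},
        "2014-2015": {"PSLO 1", "PSLO 3"},
        "2015-2016": {"PSLO 1", "PSLO 2", "PSLO 3"},
    }

    def check_coverage(plan, pslos):
        # Required: each PSLO is measured at least twice in the three-year period.
        # Recommended: three times (baseline, intervention, efficacy check).
        for pslo in sorted(pslos):
            count = sum(pslo in measured for measured in plan.values())
            if count < 2:
                print(f"{pslo}: measured {count}x; below the required minimum of two")
            elif count == 2:
                print(f"{pslo}: measured {count}x; meets the minimum, though three is recommended")
            else:
                print(f"{pslo}: measured {count}x; meets the recommended cadence")

    check_coverage(plan, {"PSLO 1", "PSLO 2", "PSLO 3"})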

 

Figure 2: SSU's Annual Program-Level Assessment Process

 

 

To generate the annual Academic Assessment Plan (AAP), educational programs must discuss and define their outcomes (both POs and PSLOs) and determine specific, measurable, short-term action steps that enable reaching the outcomes.  Programs specify suitable methods for measuring how outcomes are being met.  Benchmarks and targets are set so that the degree of success can be measured.  Data is systematically collected as appropriate for the measurement methods employed.  The collected data is shared with and reviewed by program faculty, the program assessment coordinator, and the department chair to determine whether targets were met and to identify developmental opportunities or strategies for continuous improvement.  These results are collected and documented in the Academic Assessment Report (AAR).
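As an illustration of the information a single outcome carries from plan (AAP) to report (AAR), here is a minimal sketch in Python; the field names and values are hypothetical illustrations, not LiveText's actual schema:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class OutcomeAssessment:
        # One PO or PSLO entry as it moves from the AAP (plan) to the AAR (report).
        outcome: str                    # e.g., "PSLO 1: written communication in the discipline"
        measure: str                    # e.g., "research paper scored with the program rubric"
        target: float                   # e.g., 0.80 means 80% of students reach proficiency
        result: Optional[float] = None  # filled in at the end of the academic year

        def target_met(self) -> bool:
            return self.result is not None and self.result >= self.target

    entry = OutcomeAssessment(
        outcome="PSLO 1: written communication in the discipline",
        measure="Research paper scored with the program rubric",
        target=0.80,
    )
    entry.result = 0.74
    print(entry.target_met())  # False; the AAR would document an improvement action (IAA)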

 

Beginning in spring 2017, programs will formalize their use of results in defined action steps in an Improvement Action Agenda (IAA), to be applied the following academic year.  These action items will be included explicitly in the planning for the subsequent assessment cycle.  This will allow the institution to more effectively monitor the extent to which programs are putting their assessment data to good use.  It will also allow programs to more readily report on their successes with "closing the loop" of assessment.   

 

SSU Support for Program-Level Assessment: Infrastructure, Roles, & Responsibilities

Assessment Management System (AMS) Infrastructure: LiveText

 

A strong indicator of the institution's commitment to fostering an assessment infrastructure is its investment in LiveText, a versatile Assessment Management System (AMS).  LiveText collects course-embedded assessments and stores rubric-based assessment data across the institution.  It streamlines data collection and offers analytics and reporting functionality that allow programs to view progress on student learning immediately.  Its Field Experience Manager has been invaluable for efficiently collecting external mentors' and supervisors' feedback on student performance in fields like social work and teacher education.  The LiveText Assessment Insight System (AIS) domain helps to coordinate the meta-assessment functions of defining, distributing, collecting, and reviewing assessment plans and reports.

In spring 2013, SSU's QEP began to use LiveText to help manage university-wide assessment of writing with a common rubric.  Employing this technology allowed the QEP to efficiently organize yearly writing assessment sessions with teams of normed raters.  The QEP then shared best practices in rubric development and rating calibration across campus and trained many faculty members on the LiveText technology to assist assessment practices in the programs.

Beginning in fall 2013, the university began to use the LiveText system widely to collect annual assessment plans and assessment report findings.  LiveText is the repository for each program's mission statement, program objectives, measurable outcomes, measures, evaluation instruments, artifacts, targets, data analysis, and action plans. The alignment with the University's mission via the strategic plan is also made explicit in this reporting system.

Human Resources: Roles and Responsibilities

While faculty engage directly in the assessment of POs and PSLOs, multiple campus offices and governance structures are brought to bear on the assessment of outcomes, to include student learning outcomes, within academic programs. This ensures that those with requisite expertise and/or leadership responsibilities are both supporting and monitoring the assessment process at SSU. 

 

Office of Institutional Research, Planning, and Assessment (IRPA). IRPA is responsible for coordinating, collecting, and disseminating information relating to planning and assessment connected to academic programs. As the coordinating office for ensuring that the University meets all standards for regional accreditation, IRPA has the responsibility of clearly articulating the assessment criteria that must be met in both university and program assessment efforts and of ensuring that all departmental assessment efforts clearly demonstrate that they are meeting these standards. The IRPA Office: (1) provides consultation to help programs develop and implement assessment plans that satisfy the required standards; (2) meets annually with program assessment leaders to provide diagnostics and consultation on assessment compliance (see the Program Assessment Annual Consultation Meeting form); (3) reviews assessment reports and indicates which program assessment efforts are meeting university and accreditation criteria and, if criteria are not met, identifies specific deficiencies and communicates these to the appropriate Department Chair, Dean or division head, and the Associate Provost; (4) supports the integration of systems through which assessment data can be gathered and analyzed, including surveys, online course assessment systems, learning management systems and/or ePortfolio systems, and the LiveText AMS; (5) maintains a repository of assessment data and assessment reports so that the university can provide evidence of systematic and comprehensive assessment of academic programs; and (6) develops and maintains reports tracking the performance of systematic assessment across all academic programs and the level of student achievement of university learning outcomes.

 

Institutional Assessment Committee (IAC). The IAC was established in fall 2013 and was tasked with helping to shape and guide assessment efforts on campus. The IAC provides advice, support, and recommendations about assessment strategies and resources to academic programs and to the Associate Provost. The IAC sets policies and timelines, and reviews the assessment plans, reports, materials, and processes of each academic unit.

 

Committee activities include: (1) fostering a university-wide culture of assessment by communicating the implementation and operation of the institution's ongoing assessment commitments; (2) making recommendations about the assessment of student learning outcomes based on procedures outlined in SSU's Academic Assessment Guide: Purpose, Process, and Procedure; (3) reviewing, commenting upon, and approving university assessment plans and reports; (4) identifying and communicating assessment best practices within the university community to promote the exchange and adoption of new ideas; (5) regularly reviewing the effectiveness of the academic assessment processes to support recommendations for continuous improvement; (6) providing feedback to the IRPA Assessment Coordinator regarding information and advice given to academic departments and programs, the administration of assessments, and decision-making to implement improved operating practices; and (7) disseminating information and advice to academic departments and programs to promote professional development that strengthens assessment effectiveness in: the design of assessment plans (particularly the selection of appropriate assessment methods), the administration of assessments, the interpretation of assessment results, and decision-making to implement improved operating practices.

 

The committee term is three years in duration, and the IAC membership roster is populated with the following member types:

1.       Two program assessment coordinators from each college, who serve as voting members*

2.       One representative from the Library, who serves as a voting member*

3.       Two representatives from each non-academic division, who serve as voting members*

4.       A student representative from the Student Government Association, who serves as a voting member**

5.       The IRPA Assessment Coordinator, who serves as a non-voting member

6.       The IRPA LiveText Coordinator, who serves as a non-voting member

7.       The Associate Provost, who serves in a non-voting, advisory capacity to the committee

 

*These committee members must have an alternate to represent them in the event of a scheduling conflict.

**This position is rotated annually.

 

In addition to the broad committee charge above, the IAC includes three subcommittees, whose specific tasks are detailed in Table 1 below.

 

Table 1: IAC Subcommittee Responsibilities

 

Non-Academic Assessment Review Committee: Provides feedback on, and approval of, the assessment plans and reports of the non-academic service units on campus.

Academic Assessment Review Committee: Provides feedback on, and approval of, the assessment plans and reports of the academic degree-granting programs on campus. Reviews new curriculum proposals to ensure that assessment plans are in place for new programs, degrees, or certificates.

General Education Assessment Review Committee: Provides feedback on, and approval of, the assessment plans and reports connected to ISLOs in the core curriculum and in non-degree-granting educational areas like First Year Experience, Freshman Composition, foreign language instruction, health and wellness courses, and Critical Thinking and Communication (a course required for graduation).

 

In its first year, the IAC developed an Assessment Glossary to establish a shared language on campus regarding assessment.  It also created a rubric to review 2013-2014 Baseline Assessment Reports from all areas on campus during fall 2013. Sample feedback for Political Science, Biology, and English Language and Literature is included to illustrate the guidance the committee provided to educational programs for improvement.  These baseline reports were requested to establish a picture of the assessment activities and the organizational structure around student learning evaluation occurring on campus.  Following this audit, new procedures, delineated in the description of the assessment process above and illustrated by Figure 2, were developed to collect and document assessment at SSU in a more systematic and uniform way.

 

While the IAC oversees assessment for the entire university, each college also has its own assessment committee, and nearly all departments have assessment committees as well. The next sections highlight the functions and the supervisory structure for assessment in Academic Affairs at SSU.

 

College Deans. The Dean of each college or school is responsible for ensuring three assessment requirements: (1) that each academic program develops and implements an assessment plan which systematically assesses student achievement of all learning outcomes associated with that program; (2) that evidence of student achievement of learning outcomes is collected and retained; and (3) that faculty members analyze the evidence collected and use it to guide program improvements. In consultation with Department Chairs, Deans identify Program Assessment Coordinator(s) (see the Roster of Program Assessment Coordinators) for their college or school, who work with programs to ensure that their assessment plans meet university expectations and that required data and reports are uploaded to LiveText. In the event that assessment plans or the implementation of these plans do not meet expectations, the Dean is responsible for ensuring changes are made to bring assessment plans and efforts up to a satisfactory standard. Beginning in AY 2016-2017, the Deans will prepare a brief annual report for the Provost on the state of assessment for academic programs in their respective colleges or schools, listing (a) those programs that have implemented satisfactory assessment practices, meeting the three requirements listed above, (b) any programs that are not meeting University assessment expectations, and (c) plans to remediate such programs. These reports are due by the second week of August 2017, and each subsequent August thereafter.

 

Department Chairs. Department Chairs and Program Assessment Coordinators work together to ensure that annual program objectives are set, yearly targets are identified, and progress is monitored throughout the process. Assessment results must be reviewed each year and discussed by the faculty to determine whether they demonstrate satisfactory student achievement of program learning outcomes. Assessment data, analysis, and results should be documented by the program and entered into LiveText for review by the IAC and the IRPA.

 

Chairs are responsible for ensuring that all programs are developing assessment plans and reports in accordance with SSU's assessment cycle, adhering to institutional deadlines. A review of the program AAP and AAR is to be conducted by the Chair (see the example of an AAP Review for Behavior Analysis). Chairs also ensure that the appropriate digital tools are being used to store assessment data, artifacts, plans, and reports, so that these documents will be available to future faculty and leadership. Furthermore, it is the responsibility of Chairs to ensure that all faculty are participating in the program assessment process. Chairs ensure that all appropriate information from the administration is conveyed to program assessment coordinators and faculty about the assessment process, including the scheduled assessment cycle, the ISLOs, and the due dates for plans and reports. Finally, beginning with the 2016-2017 academic year, Chairs are required to prepare a brief annual report for their Dean on the state of assessment for academic programs in their respective departments, listing (a) those programs that have implemented satisfactory assessment practices, (b) any programs that are not meeting University assessment expectations, and (c) plans to remediate such programs. These reports are due by the second week of September 2017, and each subsequent September thereafter.

 

Program Assessment Coordinators (PAC). While Chairs oversee the faculty, Program Assessment Coordinators oversee the programs themselves.  They are responsible for monitoring the curriculum and facilitating the practice of measuring the efficacy of learning occurring in the program. The position carries an annual stipend, and the term lasts three years.

 

Coordinators ensure that: (1) the program stays on schedule with its assessment responsibilities, conducting data collection during the semester, reviewing data at the end of the semester, and planning for the next assessment cycle; (2) all appropriate, data-driven changes are made to the curriculum, PSLOs, curriculum maps, and assessment tools over the course of the three-year assessment period; (3) they are trained in how to generate and store all appropriate assessment documents in LiveText; (4) program faculty are using, and requiring their students to submit work through, the D2L Learning Management System (LMS) and the AMS; (5) program meeting agendas and minutes related to assessment activities are generated and stored in the AMS; (6) PSLOs are aligned with ISLOs where appropriate, to assist the administration of the university with assessment, with the alignment communicated to the IAC via the AMS; (7) useful activities are planned for the campus Assessment Days; and (8) in coordination with the Department Chair, assessment-related meetings are held regularly to discuss assessment results and to facilitate annual improvements. Table 2 indicates the program-related materials that the PAC is responsible for submitting each academic year:

Table 2: Annual Assessment Materials to be Submitted by the PAC

 

To be submitted at the beginning of the fall semester (Assessment Plan):

1. The program's plan for PO measurement, including action agendas from the last AY
2. The program's plan for PSLO measurement, including action agendas from the last AY
3. A compiled list of all instrument(s)/rubric(s) to measure the selected assignment(s)
4. Performance targets for the selected PSLOs and POs (collaboratively agreed upon with colleagues)
5. Planning meeting minutes (uploaded)

To be submitted at the end of the spring semester (Assessment Report):

1. The program's results for PO measurement
2. The program's results for PSLO measurement
3. An indication of whether performance targets for the selected PSLOs and POs were met/not met, with an evaluation of why (data analysis)
4. The use of results and how the data analysis will inform concrete actions for improvement (IAA)
5. Supporting documentation (uploaded)

The combination of the IRPA, the standing assessment committees (university-wide, college-, and department-level), and the Program Assessment Coordinators demonstrates the University's commitment to developing and sustaining a culture of institutional effectiveness through infrastructure. Additionally, this cadre of individuals ensures that institutional effectiveness activities are monitored and that opportunities for feedback are provided and used for improvement.

Individual Faculty Members. Beginning in fall 2016, all faculty members are expected to identify and report how their assessment activities each semester connect and feed into the review of student learning performance to benefit the program or institution. This is the responsibility of each faculty member, individually, as an educator in an institution of higher education. Even if this information is not ultimately included in a formal program assessment plan or report for the major, it is important information to collect and discuss for the purposes of improving teaching and informing conversations about learning in the program and at the university, broadly construed.  Assessment information must be entered into the LiveText Assessment Management System, adhering to institutional deadlines. Regular LiveText trainings are available to assist faculty.  Assessment compliance and attendance at mandatory assessment meetings will factor into annual performance evaluations for faculty.

Table 3: Annual Assessment Materials to be Submitted by Individual Faculty Members

 

To be submitted at the beginning of the fall and spring semesters (Individual Faculty Assessment Plan):

1. Professor, semester, and courses
2. Selected assignment(s) to be assessed, with a justification for each
3. Instrument(s)/rubric(s) to measure the selected assignment(s) for each course
4. Performance targets for the selected assignment(s) (number of students expected to achieve a given level of skill)

To be submitted at the end of the fall semester (Individual Faculty Assessment Data Report for Fall Semester):

1. Screenshot/download of the data results generated by LiveText
2. Analysis of results
3. Submission of student work

To be submitted at the end of the spring semester (Individual Faculty Assessment Data Report for Spring Semester):

1. Screenshot/download of the data results generated by LiveText
2. Analysis of results
3. Submission of student work
4. Improvement Action Agenda for the next year (considering data from the fall and spring semesters), with a justification for the improvement recommendations/actions

 

Effective assessment demonstrates thoughtful use of results for improved student learning. By ensuring that all faculty participate in course-level assessments that measure, reflect upon, and demonstrate improvements to student learning, we are enriching our assessment data pool and giving our PACs more flexibility in generating the assessment plans and reports required for accreditation at the program level. Additionally, the introductory- and mid-level assessments that result from all faculty doing assessment as part of their regular semester activities strengthen those plans and reports, as well as the curricular reviews required for accreditation and by Georgia Board of Regents mandates.

Section 2: Guide to Academic Assessment Plans & Reports

The Academic Assessment Plan (AAP) template and the Academic Assessment Report (AAR) template are used to collect information from programs to demonstrate compliance with CS 3.3.1.1.  As the templates show, each program is required to develop objectives and targets that align with the institutional strategic plan.  Programs must identify enabling strategies that will allow them to reach their POs and PSLOs, identify assessment measures, and document the use of results for improvement.

AAPs outline each program's intended assessment plan for the upcoming academic year (fall/spring semesters).  Each plan identifies the program and institutional student learning outcomes to be analyzed; the courses, assignments, and assessment instruments to be used; and the targets to be met.

In each program, the program coordinator collaborates with program faculty to determine the assessment plan for the upcoming academic year.  These plans draw on the results of the previous year's report, consult the current curriculum map of the program student learning outcomes, and identify any indicated changes to the assessment plan, ensuring that each program maintains effective and complete assessment each academic year.

AARs show the results of each program's actual assessment efforts for the completed academic year. Each report details any changes made to the mission statement or program student learning outcomes during the course of the year; the assessment methods employed (direct and indirect) to analyze student learning outcomes; a determination of whether the intended targets were met, along with evidence to substantiate the determination; proof that assessment results were used to improve approaches to student learning in the program; and indications of additional plans for continued improvement, especially in cases where targets were not met or were only partially met.

Table 6 provides AARs that contain information about PSLOs and POs, as well as supporting materials, for the past three years (2013-2016).  It also includes an evaluative audit of the most recently completed assessment year (AY 2015-2016), displaying the number of PSLOs per program; the identification of measures and methods to assess outcomes (coded by assessment type); the collection of data/evidence and use of results (coded by change type); and an evaluation of each assessment report (coded by rating: emerging, maturing, or accomplished). Providing the entire three-year assessment cycle demonstrates that assessment has been ongoing and improving over time in all educational programs.

Assessment expertise across programs is best characterized on a scale from emerging to maturing to accomplished (see Table 5 for definitions), and the institution is engaged in an assessment improvement process that has seen significant gains.  Currently, SSU's 2015-2016 Assessment Reports are rated at 7% accomplished, 42% maturing, and 51% emerging (see Table 6). The goal for 2021 is for assessment reports to reach the following rating targets: at least 25% accomplished, 65% maturing, and no more than 10% emerging. Since SSU plans to expand programs, associate degrees, and certificates in the next few years, we anticipate having some new programs that will still reasonably fall in the emerging category.

Table 5 defines the codes for assessment type and improvement change type, as well as the ratings of the assessment activities each program exemplified in its 2015-2016 AY Assessment Report.  As the ratings indicate, although all programs have engaged in assessment activities, we are on a continuum of achievement.

Table 5: Codes and Ratings for 2015-2016 Assessment Reports

Codes for Assessment Type in Instructional Programs [2]

a. Research Project or Paper
b. Capstone Project
c. Quiz
d. Examination
e. Standardized Test
f. Reflective Homework
g. Presentation or Exhibition
h. Internship or Practicum
i. Thesis
j. Survey or Questionnaire
k. Lab Report
l. Other

Codes for Change Type in Instructional Programs

i. Improvement in Pedagogy
ii. Modified Assessment Method
iii. Revision to Course
iv. Amended Program Curriculum
v. Enhanced Process or Service
vi. Implementation of New Process or Service
vii. Other

RATINGS FOR ASSESSMENT REPORTS [3]

First Tier Elements

1. Mission statement describes the primary purpose, functions, and stakeholders of the program.

2. Report reflects an assessment process that describes the program or unit's assessment strategy; how that strategy is translated into outcomes and measures; and the process for reviewing, analyzing, and applying assessment data for program improvement.

3. Number of outcomes: a minimum of three POs and three PSLOs for each program are included in the plan.

4. Number and type of measures: a minimum of two appropriate, quantitative measures, at least one of which is a direct measure.

5. Measures for the outcomes meet the minimum requirements and establish specific performance targets.

6. Specific assessment instruments and student work samples are made available as supporting documentation.

Second Tier Elements

7. The report clearly focuses on formative assessment to promote continuous quality improvement (e.g., establishes baseline data, sets stretch targets based on past performance).

8. The report builds on previous assessment by including at least one measure to assess the impact of an implemented change, demonstrating a "closed loop" IE Assessment process.

EMERGING: Four or five first tier elements are met.

MATURING: All six first tier elements are met.

ACCOMPLISHED: All six first tier elements are met, plus at least one second tier element.
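To make the rating logic explicit, here is a minimal illustrative sketch in Python; it is not part of the IAC's actual tooling, and the handling of reports meeting fewer than four first tier elements is an assumption, since the rubric above assigns no rating in that case:

    def rate_report(first_tier_met, second_tier_met):
        # first_tier_met: how many of the six first tier elements are met (0-6)
        # second_tier_met: how many of the two second tier elements are met (0-2)
        if first_tier_met == 6 and second_tier_met >= 1:
            return "Accomplished"  # all six first tier elements plus at least one second tier
        if first_tier_met == 6:
            return "Maturing"      # all six first tier elements
        if first_tier_met >= 4:
            return "Emerging"      # four or five first tier elements
        return "Not rated"         # assumption: the rubric assigns no rating below Emerging

    print(rate_report(6, 1))  # Accomplished
    print(rate_report(5, 2))  # Emerging: second tier elements count only once all six first tier are met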

Table 6: Institutional Effectiveness Activities by Educational Program: 2013 – 2016

Program Name | Number of PSLOs | Measures and Methods to Assess Outcomes (Assessment Type) | Collection of Data/Evidence and Use of Results (Change Type) | Evaluation of Assessment Report (Rating)

Note: Unless otherwise indicated, each program has Assessment Reports on file for 2013-2014, 2014-2015, and 2015-2016, with multi-track programs filing reports for each track in each year. The remaining columns reflect the analysis of the 2015-2016 Assessment Reports.

School of Teacher Education

Bachelor of Science in Biology (with a secondary teacher certification track) | 5 | a, d, f, g, l | i, ii, iii, iv, vi | Maturing

Bachelor of Science in Civil Engineering Technology (track in technology education) | 5 | a, d, e, f, g, l | i, ii, iii, iv, vi | Maturing

Bachelor of Science in Electronics Engineering Technology (track in technology education) | 5 | a, d, e, f, g, l | i, ii, iii, iv, vi | Maturing

Bachelor of Science in Mathematics (with a secondary teacher certification track) | 5 | a, d, e, f, g, l | i, ii, iii, iv, vi | Maturing

College of Liberal Arts and Social Sciences

Bachelor of Arts in Africana Studies | 3 | a, c, f | ii, v, vi | Emerging

Bachelor of Arts in English Language and Literature | 5 | a, d | i, ii, iii, iv, vi | Accomplished

Bachelor of Arts in History, General | 4 | a, f | ii, vi | Emerging

Bachelor of Arts in Homeland Security and Emergency Management | 4 | a, g, i | vi | Emerging; recommend more improvement types

Bachelor of Arts in Mass Communications (Core and Tracks in: 1. Online Journalism; 2. Public Relations and Advertising; 3. Audio/Visual) | Track 1: 12; Track 2: 12; Track 3: 12 | Track 1: a, f, l; Track 2: a, b, d, f, l; Track 3: a, d, f, l | Track 1: i, iii, v, vi; Track 2: i, iii, v, vi; Track 3: i, ii, iii, iv, v, vi | Track 1: Maturing; Track 2: Maturing; Track 3: Emerging

Bachelor of Fine Arts in Visual and Performing Arts | 5 | g, h, l | ii | Maturing; recommend more improvement types

Bachelor of Interdisciplinary Studies (reports for 2014-2015 and 2015-2016 only; new program in 2013-2014) | 4 | a, j | N/A | Emerging

Bachelor of Science in Behavior Analysis | 6 | c, d, f, k | ii, vi | Accomplished

Bachelor of Science in Criminal Justice | 4 | a, d | vi | Emerging; recommend more improvement types

Bachelor of Science in Political Science | 5 | a, f | i, ii, iv, v | Maturing

Bachelor of Science in Sociology | 4 | c, d, f | ii, iii, vi | Emerging

Bachelor of Social Work | 9 | a, f, g, l | i, iii, iv, vi, vii | Accomplished

Master of Public Administration | 7 | a, b, f, l | ii, iii, v | Emerging

Master of Science in Urban Studies and Planning | 8 | a | vi | Emerging; recommend more measurement types

Master of Social Work | 9 | a, d, f, g, h, l | ii, iii, v, vi | Emerging

Post-Baccalaureate Certificate in Nonprofit Organization and Leadership (no graduates to date) | n/a | n/a | n/a | Assessment Plan in Place

Certificate of Less than One Year in Homeland Security & Emergency Management (no graduates to date) | n/a | n/a | n/a | Assessment Plan in Place

College of Business Administration

Bachelor of Business Administration in Accounting | 5 | a, d, l | i, iii, iv | Maturing

Bachelor of Business Administration in Global Logistics and International Business (reports for 2014-2015 and 2015-2016 only; new program in 2013-2014) | 5 | d, f, g, l | i | Emerging; recommend more improvement types

Bachelor of Business Administration in Information Systems | 6 | d, f | i, iii, iv | Emerging

Bachelor of Business Administration in Management (Traditional & Online) | 6 | d, f, l | ii | Maturing; recommend more improvement types

Bachelor of Business Administration in Marketing | 6 | a, d, g, l | ii | Maturing; recommend more improvement types

Master of Business Administration | 6 | d, g, l | ii, vi | Emerging

College of Sciences and Technology

Bachelor of Science in Biology | 5 | c, d, e | i, ii, iii | Emerging

Bachelor of Science in Chemistry | 5 | a, c, d, f | i, iii, iv | Maturing

Bachelor of Science in Civil Engineering Technology | 8 | a, c, d | i, vi, vii | Maturing

Bachelor of Science in Forensic Science | 4 | d, f, l | i, ii, iii, iv | Maturing

Bachelor of Science in Computer Science Technology | 11 | a, d, f, l | N/A | Maturing

Bachelor of Science in Electronics Engineering Technology | 5 | a, d, f, g, k, l | v | Maturing; recommend more improvement types

Bachelor of Science in Environmental Science | 6 | a, d, f, g, k, l | vii | Emerging; recommend more improvement types

Bachelor of Science in Marine Sciences | 4 | a, d, f, g, k | i, iii, iv, vi | Maturing

Bachelor of Science in Mathematics | 6 | a, d | iii, vi | Emerging

Master of Science in Marine Sciences | 5 | a, d, g, i, k | i, iii, iv, vi | Emerging

Associate of Science (Tracks in: 1. Pre-Physics; 2. General Technology; 3. Engineering Studies; 4. Aquarium Science; 5. Health Science) | Track 1: 2; Track 2: 2; Track 3: 2; Track 4: 4; Track 5: 3 | Track 1: g, l; Track 2: a, d, f; Track 3: a, d; Track 4: a, d, f, g, h, k, l; Track 5: a, d | Track 1: n/a; Track 2: vi; Track 3: n/a; Track 4: i, ii, iii, iv, vi; Track 5: i, iii, iv | Emerging (all five tracks)

 

Assessment of the Online BBA in Management

Currently, SSU has one fully online degree program. The Online BBA offers the upper 60 credits of the BBA in Management and targets graduates of two-year programs and individuals who have completed 60 or more credit hours but have not yet earned a degree.  This program offers a great opportunity for adult learners to complete their degree, especially active-duty military personnel in our region. In addition, this program is offered as part of our commitment to "Complete College Georgia," an initiative to create a more educated Georgia.  

The curriculum and faculty are identical for the traditional BBA and the Online BBA; the only difference in the learning experience is the delivery method of the courses.  Thus, the assessment plan for POs and PSLOs is the same for the two programs, and results for both are included within the COBA Management files located in Table 6.  An analysis of the results shows that: (a) there is no significant difference between the performance of students in face-to-face courses and those taking courses online; (b) because the program admits students who have completed at least 60 credits, assessments done in lower-level courses include only a small number of Online BBA students; and (c) since the Online BBA program is relatively new, the small number of enrolled students makes meaningful comparison difficult at this juncture.  As the program grows, more analysis will be done to examine performance trends.

Section 3: Recent Improvements and Upcoming Actions to Improve SSU's Assessment Process

SSU is striving to create a culture of assessment that extends beyond mere compliance with accreditation mandates, one in which faculty come to understand the intrinsic value of assessment as a cyclical process designed to improve both teaching and learning.  The improvements to assessment practices at SSU fall into three main categories: (1) building capacity and infrastructure, including mechanisms for better data collection and organization of information; (2) providing training and development; and (3) creating an environment where educational research is valued and supported.

Capacity-Building and Infrastructure Enhancement

The creation of the IAC was an important step toward bringing campus-wide awareness, representation, and uniformity to assessment practices.  Likewise, the adoption of the LiveText AMS was critical for making assessment part of our semester workflow.  Adding two structured Assessment Days to the academic calendar in fall 2016 indicates the institution's commitment to providing the time and space for thoughtful, reflective conversations that foster proper planning, reporting, and discussion of program assessment results. Making assessment activities part of the faculty handbook, faculty contracts, and performance review criteria has also elevated the role of assessment on campus.  The institution found that it needed better mechanisms for collecting information and for helping faculty begin more nuanced examinations of improvements, so forms with leading questions were created to elicit deeper justifications (for example: the Individual Faculty Assessment Plan, the Individual Faculty Assessment Data Report for Fall Semester, and the Individual Faculty Assessment Data Report for Spring Semester). SSU also improved the timeliness, frequency, and amount of feedback programs receive on their assessment activities from Department Chairs, the IAC, and the IRPA, and now requires annual state-of-assessment reports to be submitted to the Provost.  Finally, the upcoming restructuring of the Office of Institutional Research, Planning, and Assessment into the Office of Institutional Effectiveness, Planning, and Assessment, along with the hiring of a new Assistant Vice President to supervise this unit, will be a significant asset.

Training and Development Opportunities

In order to build an improved culture of assessment, SSU has been dedicated to providing development opportunities on campus.  In fact, between fall 2012 and fall 2016, there were 65 university-wide assessment training opportunities and assessment working group sessions. Additionally, after the Institutional Assessment Committee voted in February 2016 to confirm the measurement of SSU's ISLOs using the Association of American Colleges and Universities (AAC&U) VALUE Rubrics, the Office of the QEP arranged to have the AAC&U's Senior Director of Research and Assessment come to campus to provide faculty development sessions on rubrics and to consult with eleven educational programs across all colleges on ways to improve their assessment plans and activities. These efforts have been seminal to the development of SSU's approach to assessing academic programs thus far. However, to reach our goal of truly creating a culture of quality assessment, we have learned that we need to continue creating opportunities for disseminating results. The institution must continue to train and develop PACs and program faculty in good assessment practices, and to reinforce and improve mechanisms for getting faculty to report reflectively on their assessment activities and successes.

Creating an Environment Where Educational Research is Valued and Supported

The transformation of the Office of the QEP into the Center for Teaching Excellence and Faculty Development (see the QEP Impact Report for more detail) will provide an important location on campus for faculty to generate and complete educational research projects and to become trained peer mentors for assessment initiatives.  There are plans to appoint Faculty Assessment Fellows who will serve as guides in the colleges, assisting with assessment development in the programs and supporting individual faculty interests in the scholarship of teaching, learning, and assessment (SoTLA). Harnessing these faculty assessment champions will allow us to do better outreach to new faculty during orientation, provide more development training opportunities through the Metric Mondays series, provide more structured facilitation at Assessment Day events, and contribute to upcoming Assessment Symposia (to be held during SSU's Faculty and Staff Institutes).

SSU is actively working to build stronger measures and practices to accomplish the kind of quality assessment we are striving for.  As an institution, our commitment to building upon the forward movement we have cultivated is steadfast because reinforcing and strengthening a culture of assessment at SSU is essential for continuous improvement of student learning and the maintenance of high quality educational results.

 



[1] These ISLOs are measured with the QEP Writing Rubric and the Association of American Colleges and Universities' VALUE Rubrics. Reporting on the ISLO results will be included in the CS 3.5.1 section of the 2021 reaffirmation report to SACSCOC.

[2] This coding by type (assessment and change) is modeled on Valdosta State University's 5th Year Report for SACSCOC.

[3] This scale is adapted from University of Central Florida's "Institutional Effectiveness Assessment Plan Rubric," https://assessment.ucf.edu/doc/Institutional_Effectiveness_Assessment_Rubrics.pdf.